PULNS: Positive-Unlabeled Learning with Effective Negative Sample Selector
Authors
Abstract
Positive-unlabeled learning (PU learning) is an important case of binary classification where the training data contains only positive and unlabeled samples. The current state-of-the-art approach to PU learning is the cost-sensitive approach, which casts PU learning as a cost-sensitive classification problem and relies on unbiased risk estimators to correct the bias introduced by the unlabeled samples. However, this approach requires knowledge of the class prior and is subject to potential label noise. In this paper, we propose a novel PU learning approach dubbed PULNS, equipped with an effective negative sample selector that is optimized by reinforcement learning. PULNS employs the selector as an agent responsible for selecting likely negative samples from the unlabeled data. While the selected samples can be used to improve the classifier, the performance of the classifier also serves as the reward for improving the selector through the REINFORCE algorithm. By alternating the updates of the selector and the classifier, the performance of both is improved. Extensive experimental studies on 7 real-world application benchmarks demonstrate that PULNS consistently outperforms state-of-the-art methods in PU learning, and our results confirm the effectiveness of the negative sample selector underlying PULNS.
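As a rough illustration of the alternating scheme described in the abstract, the following is a minimal sketch, not the authors' implementation: a selector network stochastically picks likely negatives from the unlabeled pool, the classifier is trained on the positives plus the selected samples, and the classifier's accuracy on a small held-out validation set is used as the REINFORCE reward for the selector. The toy data, network sizes, validation-accuracy reward, and all hyperparameters below are illustrative assumptions.

```python
# Hedged sketch of an alternating selector/classifier loop with a REINFORCE
# update for the selector. Everything here (data, architectures, reward) is
# an assumption for illustration, not the PULNS reference code.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: x_pos are labeled positives, x_unl are unlabeled samples,
# (x_val, y_val) is a small validation set used only to compute the reward.
d = 10
x_pos = torch.randn(100, d) + 1.0
x_unl = torch.cat([torch.randn(100, d) + 1.0, torch.randn(300, d) - 1.0])
x_val = torch.cat([torch.randn(50, d) + 1.0, torch.randn(50, d) - 1.0])
y_val = torch.cat([torch.ones(50), torch.zeros(50)])

classifier = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
selector = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
opt_c = torch.optim.Adam(classifier.parameters(), lr=1e-2)
opt_s = torch.optim.Adam(selector.parameters(), lr=1e-2)
bce = nn.BCEWithLogitsLoss()
reward = 0.0
baseline = 0.5  # moving-average baseline to reduce REINFORCE variance

for step in range(200):
    # Selector: sample a subset of unlabeled points to treat as negatives.
    probs = torch.sigmoid(selector(x_unl)).squeeze(1)
    pick = torch.bernoulli(probs)            # stochastic selection (0/1 mask)
    x_neg = x_unl[pick.bool()]
    if x_neg.numel() == 0:                   # skip the rare empty selection
        continue

    # Classifier update: positives vs. the selected (likely) negatives.
    x = torch.cat([x_pos, x_neg])
    y = torch.cat([torch.ones(len(x_pos)), torch.zeros(len(x_neg))])
    loss_c = bce(classifier(x).squeeze(1), y)
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()

    # Selector update: REINFORCE with validation accuracy as the reward.
    with torch.no_grad():
        acc = ((classifier(x_val).squeeze(1) > 0).float() == y_val).float().mean()
    reward = acc.item()
    baseline = 0.9 * baseline + 0.1 * reward
    log_prob = (pick * torch.log(probs + 1e-8)
                + (1 - pick) * torch.log(1 - probs + 1e-8)).sum()
    loss_s = -(reward - baseline) * log_prob  # policy-gradient loss
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()

print(f"final reward (validation accuracy): {reward:.3f}")
```

The moving-average baseline is a common variance-reduction choice for REINFORCE; the paper's own reward shaping and selector architecture may differ.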
Related references
Theoretical Comparisons of Positive-Unlabeled Learning against Positive-Negative Learning
In PU learning, a binary classifier is trained from positive (P) and unlabeled (U) data without negative (N) data. Although N data is missing, it sometimes outperforms PN learning (i.e., ordinary supervised learning). Hitherto, neither theoretical nor experimental analysis has been given to explain this phenomenon. In this paper, we theoretically compare PU (and NU) learning against PN learning...
Theoretical Comparisons of Learning from Positive-Negative, Positive-Unlabeled, and Negative-Unlabeled Data
In PU learning, a binary classifier is trained from positive (P) and unlabeled (U) data without negative (N) data. Although N data is missing, it sometimes outperforms PN learning (i.e., ordinary supervised learning). Hitherto, neither theoretical nor experimental analysis has been given to explain this phenomenon. In this paper, we theoretically compare PU (and NU) learning against PN learning...
Positive-Unlabeled Learning with Non-Negative Risk Estimator
From only positive (P) and unlabeled (U) data, a binary classifier could be trained with PU learning, in which the state of the art is unbiased PU learning. However, if its model is very flexible, empirical risks on training data will go negative, and we will suffer from serious overfitting. In this paper, we propose a non-negative risk estimator for PU learning: when getting minimized, it is m... (the standard unbiased and non-negative estimators are sketched after this list)
Multi-Positive and Unlabeled Learning
Yixing Xu†, Chang Xu‡, Chao Xu†, Dacheng Tao‡ †Key Laboratory of Machine Perception (MOE), Cooperative Medianet Innovation Center, School of Electronics Engineering and Computer Science, PKU, Beijing 100871, China ‡UBTech Sydney AI Institute, The School of Information Technologies, The University of Sydney, J12, 1 Cleveland St, Darlington, NSW 2008, Australia ...
Positive-Unlabeled Learning for Pupylation Sites Prediction
Pupylation plays a key role in regulating various protein functions as a crucial posttranslational modification of prokaryotes. In order to understand the molecular mechanism of pupylation, it is important to identify pupylation substrates and sites accurately. Several computational methods have been developed to identify pupylation sites because the traditional experimental methods are time-co...
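For reference, the risk estimators mentioned in the abstract and in the non-negative-estimator entry above are conventionally written as follows. This is the standard formulation from the PU-learning literature (class prior \pi_p, loss \ell, n_p positive and n_u unlabeled samples), given here as a sketch rather than text taken from any of the listed papers:

```latex
% Unbiased PU risk estimator (requires the class prior \pi_p):
\hat{R}_{\mathrm{pu}}(g) \;=\; \pi_p \hat{R}_p^{+}(g) \;-\; \pi_p \hat{R}_p^{-}(g) \;+\; \hat{R}_u^{-}(g),
\qquad
\hat{R}_p^{\pm}(g) = \frac{1}{n_p}\sum_{i=1}^{n_p} \ell\!\left(g(x_i^{p}), \pm 1\right),
\quad
\hat{R}_u^{-}(g) = \frac{1}{n_u}\sum_{i=1}^{n_u} \ell\!\left(g(x_i^{u}), -1\right).

% Non-negative correction: clip the estimated negative-class risk at zero.
\tilde{R}_{\mathrm{pu}}(g) \;=\; \pi_p \hat{R}_p^{+}(g) \;+\; \max\!\left\{0,\; \hat{R}_u^{-}(g) - \pi_p \hat{R}_p^{-}(g)\right\}.
```

The max with zero keeps the estimated negative-class risk from going below zero, which is what prevents the serious overfitting of flexible models described in the entry above.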
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2021
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v35i10.17064